Optimizer (machine learning)

A specific implementation of the gradient descent algorithm. Popular optimizers include:[1]

  • AdaGrad, which stands for ADAptive GRADient descent.
  • Adam, which stands for ADAptive Moment estimation.
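As a rough illustration of what an adaptive optimizer does, here is a minimal sketch of one Adam update step in plain Python. The function name, parameter names, and the toy quadratic objective are assumptions for illustration, not part of the glossary; the default hyperparameters (`lr`, `beta1`, `beta2`, `eps`) follow the values commonly cited for Adam.

```python
def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update on a scalar parameter; returns (theta, m, v).

    Hypothetical helper for illustration: m and v are running estimates of
    the gradient's first and second moments, t is the 1-based step count.
    """
    m = beta1 * m + (1 - beta1) * grad       # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (scale) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (v_hat ** 0.5 + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta**2, whose gradient is 2 * theta.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.05)
```

After enough steps, `theta` settles near the minimum at 0. The per-parameter scaling by `v_hat` is what makes Adam "adaptive": steep, noisy directions get smaller effective steps than flat ones.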

Footnotes

  1. developers.google.com/machine-learning/glossary#optimizer